Stochastic Tree Ensembles for Regularized Nonlinear Regression
Authors
Abstract
This article develops a novel stochastic tree ensemble method for nonlinear regression, referred to as accelerated Bayesian additive regression trees, or XBART. By combining regularization and search strategies from Bayesian modeling with computationally efficient techniques from recursive partitioning algorithms, XBART attains state-of-the-art performance at prediction function estimation. Simulation studies demonstrate that XBART provides accurate point-wise estimates of the mean function and does so faster than popular alternatives, such as BART, XGBoost, and neural networks (using Keras), on a variety of test functions. Additionally, it is demonstrated that using XBART to initialize the standard BART MCMC algorithm considerably improves credible interval coverage and reduces total run-time. Finally, two basic theoretical results are established: the single-tree version of the model is asymptotically consistent, and the Markov chain produced by the algorithm has a unique stationary distribution.
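As a rough illustration of the kind of simulation study described above (not the authors' code), the sketch below fits a gradient-boosted tree model to the Friedman test function and reports held-out root mean squared error; the XBART fit itself would use the authors' package, whose interface is not assumed here.

```python
# Illustrative benchmark sketch: estimate a known test function and measure
# held-out error. Assumes scikit-learn and xgboost are installed.
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Friedman #1 test function with additive Gaussian noise
X, y = make_friedman1(n_samples=5000, n_features=10, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = XGBRegressor(n_estimators=500, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)
rmse = np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2))
print(f"XGBoost RMSE on held-out data: {rmse:.3f}")
```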
Similar Resources
Regularized multivariate stochastic regression
In many high dimensional problems, the dependence structure among the variables can be quite complex. An appropriate use of the regularization techniques coupled with other classical statistical methods can often improve estimation and prediction accuracy and facilitate model interpretation, by seeking a parsimonious model representation that involves only the subset of relevant variables. We p...
Lazy Sparse Stochastic Gradient Descent for Regularized Multinomial Logistic Regression
Stochastic gradient descent efficiently estimates maximum likelihood logistic regression coefficients from sparse input data. Regularization with respect to a prior coefficient distribution destroys the sparsity of the gradient evaluated at a single example. Sparsity is restored by lazily shrinking a coefficient along the cumulative gradient of the prior just before the coefficient is needed. ...
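A minimal sketch of the lazy-update idea described above, written for binary rather than multinomial logistic regression and with an L2 (Gaussian-prior) penalty; the function name, constants, and sparse input format are illustrative assumptions, not the paper's implementation.

```python
# Lazily applied L2 shrinkage in sparse SGD for binary logistic regression.
import math
from collections import defaultdict

def lazy_sgd(examples, lr=0.1, lam=1e-4, epochs=5):
    """examples: list of (sparse_x, y), sparse_x a dict {feature: value}, y in {0, 1}."""
    w = defaultdict(float)        # coefficients
    last_seen = defaultdict(int)  # step at which each coefficient was last shrunk
    t = 0
    for _ in range(epochs):
        for x, y in examples:
            t += 1
            # Apply the regularization accumulated since each active coefficient
            # was last touched, instead of shrinking every coefficient each step.
            for j in x:
                w[j] *= (1.0 - lr * lam) ** (t - last_seen[j])
                last_seen[j] = t
            # Standard logistic-loss gradient step on the active features only.
            z = sum(w[j] * v for j, v in x.items())
            p = 1.0 / (1.0 + math.exp(-z))
            for j, v in x.items():
                w[j] += lr * (y - p) * v
    return w

examples = [({0: 1.0, 3: 2.0}, 1), ({1: 1.0}, 0), ({0: 0.5, 2: 1.0}, 1)]
print(dict(lazy_sgd(examples)))
```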
Kernel Quantile Regression for Nonlinear Stochastic Models
We consider kernel quantile estimates for drift and scale functions in nonlinear stochastic regression models. Under a general dependence setting, we establish asymptotic point-wise and uniform Bahadur representations for the kernel quantile estimates. Based on those asymptotic representations, central limit theorems are obtained. Applications to nonlinear autoregressive models and linear proce...
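The following toy sketch shows one simple form of a kernel (locally weighted) quantile estimate; it is only meant to illustrate the general idea of kernel quantile estimation, not the estimators or asymptotic analysis of the cited work, and the bandwidth and function name are made up for the example.

```python
# Locally weighted sample quantile with Gaussian kernel weights (illustrative only).
import numpy as np

def kernel_quantile(x, y, x0, tau=0.5, bandwidth=0.3):
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)   # kernel weights around x0
    order = np.argsort(y)
    cum = np.cumsum(w[order]) / np.sum(w)            # weighted empirical CDF
    idx = min(np.searchsorted(cum, tau), len(y) - 1)
    return y[order][idx]

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=500)
y = np.sin(x) + rng.normal(scale=0.3, size=500)
print(kernel_quantile(x, y, x0=0.5, tau=0.9))
```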
Random Subspacing for Regression Ensembles
In this work we present a novel approach to ensemble learning for regression models by combining the ensemble generation technique of the random subspace method with the ensemble integration methods of Stacked Regression and Dynamic Selection. We show that for simple regression methods such as global linear regression and nearest neighbours, this is a more effective method than the popular ensembl...
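A small sketch of the general recipe described above, combining random-subspace ensemble generation with a stacked (linear) integration step; the base learners, split sizes, and use of an unconstrained linear combiner are illustrative choices, not the cited paper's exact setup.

```python
# Random-subspace ensemble of simple regressors, integrated by stacked regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=1000)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Ensemble generation: each member sees a random subset of the features.
subspaces = [rng.choice(X.shape[1], size=8, replace=False) for _ in range(25)]
members = [KNeighborsRegressor(n_neighbors=10).fit(X_tr[:, s], y_tr) for s in subspaces]

# Ensemble integration: a stacked linear model learns combination weights on
# held-out predictions (classic Stacked Regression constrains these weights
# to be nonnegative; a plain least-squares combiner is used here for brevity).
Z_val = np.column_stack([m.predict(X_val[:, s]) for m, s in zip(members, subspaces)])
stacker = LinearRegression().fit(Z_val, y_val)
Z_te = np.column_stack([m.predict(X_te[:, s]) for m, s in zip(members, subspaces)])
print("stacked test R^2:", stacker.score(Z_te, y_te))
```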
Sparse regularized local regression
The intention is to provide a Bayesian formulation of regularized local linear regression, combined with techniques for optimal bandwidth selection. This approach arises from the idea that only those covariates that are found to be relevant for the regression function should be considered by the kernel function used to define the neighborhood of the point of interest. However, the regression fu...
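As a non-Bayesian toy version of the idea above, the sketch below fits a kernel-weighted, ridge-regularized local linear regression at a query point; the Gaussian kernel, fixed bandwidth, and penalty value are assumptions for illustration rather than the cited method's optimal-bandwidth procedure.

```python
# Kernel-weighted, ridge-regularized local linear regression at a query point.
import numpy as np

def local_linear_ridge(X, y, x0, bandwidth=0.5, alpha=1e-2):
    """Predict at x0 by a kernel-weighted ridge regression on (X, y)."""
    d = np.linalg.norm(X - x0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)           # Gaussian kernel weights
    Xc = np.hstack([np.ones((len(X), 1)), X - x0])    # local intercept + slopes
    W = np.diag(w)
    A = Xc.T @ W @ Xc + alpha * np.eye(Xc.shape[1])   # ridge-regularized normal equations
    beta = np.linalg.solve(A, Xc.T @ W @ y)
    return beta[0]                                    # fitted value at x0

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(300, 1))
y = np.sin(2 * X[:, 0]) + rng.normal(scale=0.2, size=300)
print(local_linear_ridge(X, y, x0=np.array([0.5])))
```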
Journal
Journal title: Journal of the American Statistical Association
Year: 2021
ISSN: 0162-1459, 1537-274X, 2326-6228, 1522-5445
DOI: https://doi.org/10.1080/01621459.2021.1942012